FedCVT: Semi-supervised Vertical Federated Learning with Cross-view Training

Authors

Abstract

Federated learning allows multiple parties to build machine learning models collaboratively without exposing their private data. In particular, vertical federated learning (VFL) enables participating parties to build a joint machine learning model based upon distributed features of aligned samples. However, VFL requires all parties to share a sufficient amount of aligned samples. In reality, the set of aligned samples may be small, leaving the majority of the non-aligned data unused. In this article, we propose Federated Cross-view Training (FedCVT), a semi-supervised learning approach that improves the performance of the VFL model with limited aligned samples. More specifically, FedCVT estimates representations for missing features, predicts pseudo-labels for unlabeled samples to expand the training set, and trains three classifiers jointly based upon different views of the expanded training set to improve the VFL model's performance. FedCVT does not require parties to share their original data and model parameters, thus preserving data privacy. We conduct experiments on the NUS-WIDE, Vehicle, and CIFAR10 datasets. The experimental results demonstrate that FedCVT significantly outperforms vanilla VFL that only utilizes aligned samples. Finally, we perform ablation studies to investigate the contribution of each component of FedCVT to its performance.
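The abstract describes expanding a small aligned (labeled) training set with confidently pseudo-labeled samples before retraining. The following is a minimal, centralized sketch of that pseudo-labeling step only, not the authors' federated implementation: the toy two-Gaussian data, the nearest-centroid classifier, and the confidence threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (an assumption, not from the paper): two Gaussian classes,
# of which only 5 samples per class are "aligned" (labeled).
X = np.vstack([rng.normal(-2.0, 1.0, size=(50, 2)),
               rng.normal(+2.0, 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)
mask = np.zeros(len(X), dtype=bool)
mask[rng.choice(50, size=5, replace=False)] = True        # 5 labeled in class 0
mask[50 + rng.choice(50, size=5, replace=False)] = True   # 5 labeled in class 1

# Step 1: fit class centroids on the small aligned set.
centroids = np.array([X[mask & (y == c)].mean(axis=0) for c in (0, 1)])

def predict(points):
    # Nearest-centroid classifier; the distance margin between the two
    # centroids serves as a crude confidence score.
    d = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1), np.abs(d[:, 0] - d[:, 1])

# Step 2: pseudo-label the non-aligned samples and keep only the
# confident ones to expand the training set.
labels_u, margin_u = predict(X[~mask])
confident = margin_u > 1.0
X_exp = np.vstack([X[mask], X[~mask][confident]])
y_exp = np.concatenate([y[mask], labels_u[confident]])

# Step 3: refit the classifier on the expanded training set.
centroids = np.array([X_exp[y_exp == c].mean(axis=0) for c in (0, 1)])
pred, _ = predict(X)
print(f"expanded set size: {len(X_exp)}, accuracy: {(pred == y).mean():.2f}")
```

FedCVT additionally estimates representations for each party's missing features and trains three classifiers over different feature views; this sketch shows only the shared idea of growing the labeled set with confident pseudo-labels.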


Related articles

Training SpamAssassin with Active Semi-supervised Learning

Most spam filters include some automatic pattern classifiers based on machine learning and pattern recognition techniques. Such classifiers often require a large training set of labeled emails to attain a good discriminant capability between spam and legitimate emails. In addition, they must be frequently updated because of the changes introduced by spammers to their emails to evade spam filter...

Active + Semi-supervised Learning = Robust Multi-View Learning

In a multi-view problem, the features of the domain can be partitioned into disjoint subsets (views) that are sufficient to learn the target concept. Semi-supervised, multi-view algorithms, which reduce the amount of labeled data required for learning, rely on the assumptions that the views are compatible and uncorrelated (i.e., every example is identically labeled by the target concepts in eac...

Semi-Supervised Learning with Very Few Labeled Training Examples

In semi-supervised learning, a number of labeled examples are usually required for training an initial weakly useful predictor which is in turn used for exploiting the unlabeled examples. However, in many real-world applications there may exist very few labeled training examples, which makes the weakly useful predictor difficult to generate, and therefore these semi-supervised learning methods c...

Semi-Supervised Active Learning with Cross-Class Sample Transfer

To save the labeling efforts for training a classification model, we can simultaneously adopt Active Learning (AL) to select the most informative samples for human labeling, and Semi-supervised Learning (SSL) to construct effective classifiers using a few labeled samples and a large number of unlabeled samples. Recently, using Transfer Learning (TL) to enhance AL and SSL, i.e., T-SS-AL, has gai...

Semi-Supervised Learning with Trees

We describe a nonparametric Bayesian approach to generalizing from few labeled examples, guided by a larger set of unlabeled objects and the assumption of a latent tree-structure to the domain. The tree (or a distribution over trees) may be inferred using the unlabeled data. A prior over concepts generated by a mutation process on the inferred tree(s) allows efficient computation of the optimal...

Journal

Journal: ACM Transactions on Intelligent Systems and Technology

Year: 2022

ISSN: 2157-6904, 2157-6912

DOI: https://doi.org/10.1145/3510031